Spike time displacement-based error backpropagation in convolutional spiking neural networks

Authors

Abstract

In this paper, we introduce a supervised learning algorithm, which avoids backward recursive gradient computation, for training deep convolutional spiking neural networks (SNNs) with single-spike-based temporal coding. The algorithm employs a linear approximation to compute the derivative of the spike latency with respect to the membrane potential, and it uses a piecewise linear postsynaptic potential to reduce the computational cost and the complexity of neural processing. To evaluate the performance of the proposed algorithm in deep architectures, we employ convolutional SNNs in an image classification task. On two popular benchmarks, the MNIST and Fashion-MNIST datasets, the network reaches accuracies of 99.2% and 92.8%, respectively. The trade-off between memory storage capacity and accuracy is analyzed by applying two sets of weights: real-valued weights, which are updated in the backward pass, and their signs, binary weights, which are employed in the feedforward process. We evaluate the binary CSNN on both datasets and obtain acceptable accuracies with negligible drops (about 0.6% and 0.8%, respectively).
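The two core ingredients described in the abstract — a piecewise linear postsynaptic potential (PSP) and a linear approximation of the spike-latency derivative — can be sketched as follows. This is an illustrative NumPy sketch, not the authors' implementation; the ramp-shaped kernel, time constant, and threshold value are assumptions:

```python
import numpy as np

THRESHOLD = 1.0
TAU = 10.0  # assumed PSP time constant (ms)

def psp(t, t_pre):
    """Piecewise linear PSP: zero before the presynaptic spike,
    then an unclipped linear ramp (assumed kernel shape)."""
    dt = t - t_pre
    return np.where(dt > 0, dt / TAU, 0.0)

def spike_time(weights, pre_times, t_grid):
    """First threshold crossing of an integrate-and-fire neuron whose
    membrane potential is a weighted sum of piecewise linear PSPs."""
    u = sum(w * psp(t_grid, tp) for w, tp in zip(weights, pre_times))
    idx = np.argmax(u >= THRESHOLD)
    return t_grid[idx] if u[idx] >= THRESHOLD else np.inf

def dspike_du(weights, pre_times, t_star):
    """Linear approximation dt*/du = -1 / (du/dt) at the crossing:
    with linear ramps, du/dt is the sum of active weights over TAU."""
    slope = sum(w / TAU for w, tp in zip(weights, pre_times) if tp < t_star)
    return -1.0 / slope

# Toy example: two presynaptic spikes drive the neuron over threshold.
t_grid = np.linspace(0.0, 50.0, 5001)
w, times = [0.8, 0.6], [2.0, 5.0]
t_star = spike_time(w, times, t_grid)
grad = dspike_du(w, times, t_star)
```

With linear ramps the membrane potential is itself piecewise linear, so the latency derivative is exact between kink points — which is what makes this approximation cheap.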


Similar articles

Convolutional Spike Timing Dependent Plasticity based Feature Learning in Spiking Neural Networks

Brain-inspired learning models attempt to mimic the cortical architecture and computations performed in the neurons and synapses constituting the human brain to achieve its efficiency in cognitive tasks. In this work, we present convolutional spike timing dependent plasticity based feature learning with biologically plausible leaky-integrate-and-fire neurons in Spiking Neural Networks (SNNs). W...


Training Deep Spiking Neural Networks Using Backpropagation

Deep spiking neural networks (SNNs) hold the potential for improving the latency and energy efficiency of deep neural networks through data-driven event-based computation. However, training such networks is difficult due to the non-differentiable nature of spike events. In this paper, we introduce a novel technique, which treats the membrane potentials of spiking neurons as differentiable signa...
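The key idea in this excerpt — treating membrane potentials as differentiable signals — is commonly realized with a surrogate gradient: a hard threshold in the forward pass, with a smooth function of the membrane potential standing in for its derivative in the backward pass. A generic sketch (the sigmoid surrogate and its sharpness `beta` are assumptions, not this paper's exact formulation):

```python
import numpy as np

def spike_forward(u, threshold=1.0):
    """Non-differentiable spike generation: Heaviside step of the
    membrane potential u."""
    return (u >= threshold).astype(float)

def spike_backward(u, threshold=1.0, beta=5.0):
    """Surrogate derivative: a smooth, bell-shaped function of u
    replaces the step's zero-almost-everywhere derivative."""
    s = 1.0 / (1.0 + np.exp(-beta * (u - threshold)))
    return beta * s * (1.0 - s)

# Gradients are largest for neurons near threshold, so learning
# signal flows mainly through neurons that almost (or barely) fired.
u = np.array([0.2, 0.9, 1.1, 2.0])
spikes = spike_forward(u)
grads = spike_backward(u)
```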


Error-Backpropagation in Networks of Fractionally Predictive Spiking Neurons

We develop a learning rule for networks of spiking neurons where signals are encoded using fractionally predictive spike-coding. In this paradigm, neural output signals are encoded as a sum of shifted power-law kernels. Simple greedy thresholding can compute this encoding, and spike-trains are then exactly the signal’s fractional derivative. Fractionally predictive spike-coding exploits natural...
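The encoding step described here — greedy thresholding of a residual against a sum of shifted power-law kernels — can be sketched roughly as follows. The kernel exponent, threshold, and exact greedy rule are assumptions for illustration, not the paper's derivation:

```python
import numpy as np

def powerlaw_kernel(length, beta=0.8):
    """Decaying power-law kernel k[n] = (n + 1)^(-beta) (assumed form)."""
    n = np.arange(length)
    return (n + 1.0) ** (-beta)

def greedy_spike_encode(signal, threshold=0.5, beta=0.8):
    """Greedy thresholding: emit a spike whenever the residual between
    the signal and the running kernel sum reaches the threshold, then
    add a shifted kernel to the reconstruction (a minimal sketch)."""
    T = len(signal)
    kernel = powerlaw_kernel(T, beta)
    recon = np.zeros(T)
    spikes = np.zeros(T, dtype=int)
    for t in range(T):
        if signal[t] - recon[t] >= threshold:
            spikes[t] = 1
            recon[t:] += kernel[: T - t]
    return spikes, recon

# A constant signal triggers an initial spike, then occasional spikes
# that compensate the slow power-law decay of earlier kernels.
signal = np.ones(20)
spikes, recon = greedy_spike_encode(signal)
```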


Error-backpropagation in temporally encoded networks of spiking neurons

For a network of spiking neurons that encodes information in the timing of individual spike-times, we derive a supervised learning rule, SpikeProp, akin to traditional error-backpropagation. With this algorithm, we demonstrate how networks of spiking neurons with biologically reasonable action potentials can perform complex non-linear classification in fast temporal coding just as well as rate-...
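SpikeProp's output-layer update can be sketched for a single neuron: the spike-time error is divided by the local slope of the membrane potential at the firing time, then distributed across weights via the spike-response kernel. A sketch assuming the standard alpha-shaped kernel; the time constant and sign conventions are assumptions:

```python
import numpy as np

TAU = 7.0  # assumed synaptic time constant (ms)

def eps(s):
    """Spike-response kernel eps(s) = (s/tau) * exp(1 - s/tau) for s > 0."""
    return np.where(s > 0, (s / TAU) * np.exp(1 - s / TAU), 0.0)

def deps(s):
    """Derivative of the kernel with respect to s."""
    return np.where(s > 0, (1.0 / TAU) * (1 - s / TAU) * np.exp(1 - s / TAU), 0.0)

def spikeprop_update(t_actual, t_target, weights, pre_times, lr=0.1):
    """Output-layer SpikeProp step:
    delta = (t_target - t_actual) / sum_i w_i * eps'(t_actual - t_i),
    then w_i <- w_i - lr * delta * eps(t_actual - t_i)."""
    denom = sum(w * deps(t_actual - t) for w, t in zip(weights, pre_times))
    delta = (t_target - t_actual) / denom
    return [w - lr * delta * eps(t_actual - t) for w, t in zip(weights, pre_times)]

# Toy example: the neuron fired at t=3.0 ms but should fire at t=2.0 ms,
# so the update should strengthen the driving weights.
new_w = spikeprop_update(3.0, 2.0, [1.0, 1.0], [0.0, 1.0])
```

Dividing by the membrane-potential slope is the same linearization of the threshold-crossing time that later single-spike learning rules build on.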


Spike-Timing Error Backpropagation in Theta Neuron Networks

The main contribution of this letter is the derivation of a steepest gradient descent learning rule for a multilayer network of theta neurons, a one-dimensional nonlinear neuron model. Central to our model is the assumption that the intrinsic neuron dynamics are sufficient to achieve consistent time coding, with no need to involve the precise shape of postsynaptic currents; this assumption depa...
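The theta neuron referenced above can be sketched with simple Euler integration: the phase variable theta evolves under a constant input current, and a spike is registered when theta crosses pi. The time step, initial phase, and input scaling below are assumptions for illustration:

```python
import numpy as np

def theta_neuron_spike_time(I, theta0=-1.0, dt=1e-4, t_max=50.0):
    """Euler-integrate the theta neuron
    d(theta)/dt = (1 - cos theta) + (1 + cos theta) * I
    and return the first time theta crosses pi (a spike), or None."""
    theta, t = theta0, 0.0
    while t < t_max:
        theta += dt * ((1 - np.cos(theta)) + (1 + np.cos(theta)) * I)
        t += dt
        if theta >= np.pi:
            return t
    return None

# For suprathreshold input (I > 0) the neuron always fires, and a
# stronger current yields an earlier spike time.
t_weak = theta_neuron_spike_time(0.5)
t_strong = theta_neuron_spike_time(2.0)
```

Because the one-dimensional phase dynamics alone determine the spike time, no assumption about the postsynaptic current shape is needed — the departure from SpikeProp-style rules that the letter highlights.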



Journal

Journal title: Neural Computing and Applications

Year: 2023

ISSN: 0941-0643 (print), 1433-3058 (online)

DOI: https://doi.org/10.1007/s00521-023-08567-0